Multimed Tools Appl ; 81(26): 37351-37377, 2022.
Article in English | MEDLINE | ID: covidwho-1935849

ABSTRACT

The years 2020 and 2021 witnessed Covid-19, which was the leading cause of death throughout the world during this period. It affected a large geographic area, particularly countries with large populations. Because this novel coronavirus was detected in all countries around the world, the World Health Organization (WHO) declared Covid-19 a pandemic. The virus spreads quickly from person to person through saliva droplets and through direct or indirect contact with an infected person. The tests used to detect Covid-19 are time-consuming, which is a primary cause of the rapid growth in Covid-19 cases. Early detection of Covid-19 patients can play a significant role in breaking the chain of infection by isolating patients and providing proper treatment at the right time. Recent research on Covid-19 claims that chest CT and X-ray images can serve as a preliminary screen for Covid-19 detection. This paper proposes an Artificial Intelligence (AI) based approach for detecting Covid-19 from X-ray and CT scan images. Because only a small Covid dataset is available, we use pre-trained models. Four pre-trained models, VGGNet-19, ResNet50, InceptionResNetV2 and MobileNet, are trained to classify X-ray images into Covid and Normal classes. Each model is tuned, using normalization and regularization techniques, so that a smaller percentage of Covid cases are classified as Normal. A modified binary cross-entropy loss (BCEL) function imposes a large penalty for classifying any Covid case as Normal. The experimental results reveal that the proposed InceptionResNetV2 model outperforms the other pre-trained models, with training, validation and test accuracy of 99.2%, 98% and 97% respectively.
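The abstract describes a modified binary cross-entropy loss that penalizes classifying a Covid case as Normal more heavily than the reverse error. The paper's exact formulation is not given here; a common way to realize such an asymmetric penalty is a class-weighted binary cross-entropy, sketched below with an illustrative (assumed, not from the paper) weight of 5.0 on the Covid class:

```python
import numpy as np

def weighted_bce(y_true, y_pred, covid_weight=5.0, eps=1e-7):
    """Class-weighted binary cross-entropy.

    y_true: 1.0 for Covid, 0.0 for Normal.
    y_pred: predicted probability of the Covid class.
    covid_weight: multiplier on the Covid term, so scoring a
        Covid sample toward Normal costs more than the reverse.
        The value 5.0 is illustrative, not taken from the paper.
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    loss = -(covid_weight * y_true * np.log(y_pred)
             + (1.0 - y_true) * np.log(1.0 - y_pred))
    return loss.mean()

# A Covid sample misclassified with probability 0.1 incurs a
# larger loss than a Normal sample misclassified with the same
# confidence (probability 0.9 of Covid).
covid_missed = weighted_bce(np.array([1.0]), np.array([0.1]))
normal_missed = weighted_bce(np.array([0.0]), np.array([0.9]))
```

With a weight above 1.0 on the Covid class, a trained model is pushed toward fewer Covid-to-Normal errors, matching the tuning goal stated in the abstract.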
